Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. As a secondary input modality, gaze can improve our understanding of the user's intention; it can also serve as the main input modality for users with some level of permanent or temporary impairment. In this paper we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.
In natural course, human beings usually make use of multi-sensory modalities for effective communica...
Tracking the gaze of a person has been possible for several decades. Until recently, it was mostly d...
This dissertation introduces the design of a multimodal, adaptive real-time assistive system as an a...
Speech has been used as the foundation for many human/machine interactive systems to convey the user...
This research is about addressing the need to better understand interaction with conversational user...
Pfeiffer T. Gaze-based assistive technologies. In: Kouroupetroglou G, ed. Assistive Technologies and...
The promises of multimodal interaction to make interaction more natural, less error-prone ...
Physically impaired users cannot handle traditional input devices such as keyboard...
This paper describes an intelligent system that we developed to support affective multimodal human-c...
The human’s ability to see, listen and speak is naturally embedded in how we interact and communicat...
Gaze and speech are rich contextual sources of information that, when combined, can result in effect...
Gaze-based interaction lets users operate computers through eye movement, and promises to especia...
Motor and communication disabilities are common conditions that may imply restrictions in daily ...